Rank Ordered Autoencoders

Author

  • Paul Bertens
Abstract

A new method for the unsupervised learning of sparse representations using autoencoders is proposed and implemented by ordering the outputs of the hidden units by their activation value and progressively reconstructing the input in this order. This can be done efficiently in parallel using cumulative sums, with the sorting only slightly increasing the computational cost. Minimizing the difference between this progressive reconstruction and the input can be seen as minimizing the number of active output units required to reconstruct the input; the model thus learns to reconstruct optimally using the smallest number of active output units. This leads to high sparsity without the need for extra hyperparameters: the amount of sparsity is instead learned implicitly by minimizing the progressive reconstruction error. Results of the trained model are given for patches of the CIFAR10 dataset, showing rapid convergence of the features and extremely sparse output activations while maintaining a minimal reconstruction error and exhibiting extreme robustness to overfitting. Additionally, the reconstruction error as a function of the number of active units is presented, which shows that the autoencoder learns a rank order code over the input in which the highest ranked units correspond to the largest decrease in reconstruction error.
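The abstract gives no explicit formulation, but the rank-ordered progressive reconstruction it describes can be sketched roughly as follows for a single hidden layer with a linear decoder. This is a minimal illustration under assumed choices (a ReLU encoder, a mean over all prefix lengths, and placeholder names such as rank_ordered_loss, W_enc and W_dec), not the authors' implementation:

    import numpy as np

    def rank_ordered_loss(x, h, W_dec):
        """Progressive reconstruction error averaged over all prefix lengths.
        x: (D,) input, h: (K,) non-negative hidden activations, W_dec: (K, D) decoder weights."""
        order = np.argsort(-h)                     # rank hidden units, highest activation first
        contrib = h[order, None] * W_dec[order]    # (K, D): each unit's contribution to the reconstruction
        partial = np.cumsum(contrib, axis=0)       # partial[k] = reconstruction using the top k+1 units
        errors = ((partial - x) ** 2).sum(axis=1)  # squared error after each additional unit
        return errors.mean()                       # small loss requires few, highly ranked units to suffice

    # Toy usage on a random patch-sized input (4x4x3 = 48 values); the ReLU encoder is an assumption
    rng = np.random.default_rng(0)
    D, K = 48, 128
    W_enc = rng.normal(size=(D, K)) * 0.1
    W_dec = rng.normal(size=(K, D)) * 0.1
    x = rng.normal(size=D)
    h = np.maximum(x @ W_enc, 0.0)
    print(rank_ordered_loss(x, h, W_dec))

Because every prefix of the ranked units contributes to the loss, training pushes the reconstruction to be as accurate as possible with as few highly ranked units as possible, which matches the sparsity behaviour described above; the sort and the cumulative sum keep the computation parallelizable at only a modest extra cost.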


Similar papers

On Rank-Ordered Nested Multinomial Logit Model and D-Optimal Design for this Model

In contrast to the classical discrete choice experiment, the respondent in a rank-order discrete choice experiment is asked to rank a number of alternatives instead of indicating only the preferred one. In this paper, we study the information matrix of a rank-order nested multinomial logit model (RO.NMNL), introduce a local D-optimality criterion, and then obtain locally D-optimal designs for RO.NMNL models in...


The Valuation Difference Rank of an Ordered Difference Field

There are several equivalent characterizations of the valuation rank of an ordered field (endowed with its natural valuation). In this paper, we extend the theory to the case of an ordered difference field and introduce the notion of difference rank. We characterize the difference rank as the quotient modulo the equivalence relation naturally induced by the automorphism (which encodes its growt...


Spectral Bisection Tree Guided Deep Adaptive Exemplar Autoencoder for Unsupervised Domain Adaptation

Learning with limited labeled data is always a challenge in AI problems, and one of the promising ways is to transfer well-established source domain knowledge to the target domain, i.e., domain adaptation. In this paper, we extend deep representation learning to the domain adaptation scenario and propose a novel deep model called “Deep Adaptive Exemplar AutoEncoder (DAE)”. Different from conventio...


A Connection Between Score Matching and Denoising Autoencoders

Denoising autoencoders have been previously shown to be competitive alternatives to restricted Boltzmann machines for unsupervised pretraining of each layer of a deep architecture. We show that a simple denoising autoencoder training criterion is equivalent to matching the score (with respect to the data) of a specific energy-based model to that of a nonparametric Parzen density estimator of th...


Finitely Generated Rank-Ordered Sets as a Model for Type: Type

The collection of isomorphism classes of finitely generated rank-ordered sets is shown to be a finitely generated rank-ordered set again. This is used to construct a model of the simply typed lambda calculus extended by the assumption Type: Type. Besides this, the structure of rank-ordered sets is studied. They can be represented as inverse limits of ω-cochains of substructures, each being a retract...



Journal:
  • CoRR

Volume: abs/1605.01749   Issue: -

Pages: -

Publication date: 2016